Variational Cumulant Expansions for Intractable Distributions
Pierre van de Laar, RWCP (Real World Computing Partnership) Theoretical Foundation SNN (Foundation for Neural Networks)

Author

  • David Barber
Abstract

Intractable distributions present a common difficulty in inference within the probabilistic knowledge representation framework, and variational methods have recently been popular in providing an approximate solution. In this article, we describe a perturbational approach in the form of a cumulant expansion which, to lowest order, recovers the standard Kullback-Leibler variational bound. Higher-order terms describe corrections on the variational approach without incurring much further computational cost. The relationship to other perturbational approaches such as TAP is also elucidated. We demonstrate the method on a particular class of undirected graphical models, Boltzmann machines, for which our simulation results confirm improved accuracy and enhanced stability during learning.
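The core idea can be sketched as follows (a hedged reconstruction from the abstract, not the paper's exact notation). For a target distribution p(x) = e^{-E(x)}/Z and a tractable approximation q(x), the log partition function admits a cumulant expansion whose first term is the familiar variational bound:

```latex
\log Z = \log \sum_x q(x)\, e^{-E(x) - \log q(x)}
       = \log \left\langle e^{V} \right\rangle_q,
\qquad V(x) \equiv -E(x) - \log q(x).
```

Expanding in the cumulants of V under q,

```latex
\log \left\langle e^{V} \right\rangle_q
  = \langle V \rangle_q
  + \tfrac{1}{2}\!\left( \langle V^2 \rangle_q - \langle V \rangle_q^2 \right)
  + \dots
```

Truncating at the first cumulant gives \(\log Z \ge -\langle E \rangle_q + H[q]\), which is exactly the standard Kullback-Leibler bound; the second cumulant is a non-negative variance correction on top of it, which is one way to read the abstract's claim that higher-order terms correct the variational approximation at little extra cost.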


Related articles

Variational Cumulant Expansions for Intractable Distributions

Intractable distributions present a common difficulty in inference within the probabilistic knowledge representation framework, and variational methods have recently been popular in providing an approximate solution. In this article, we describe a perturbational approach in the form of a cumulant expansion which, to lowest order, recovers the standard Kullback-Leibler variational bound. Higher-or...


An Application of Linear Response Learning

Linear response is an approximation method for Boltzmann machines based on mean field theory. It is known that in the absence of hidden units this method can learn the network quite accurately at the cost of only one matrix inversion. We show that adding a flat distribution to the target can decrease the classification error. We apply linear response learning to a real world data set of digi...
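A minimal sketch of the linear-response idea mentioned in the teaser, assuming the standard mean-field formulation for a fully visible Boltzmann machine (the weights, field values, and function names here are illustrative, not taken from the paper): magnetisations come from a fixed-point iteration, and correlations then require only one matrix inversion.

```python
import numpy as np

def mean_field_magnetisations(W, theta, n_iter=200):
    """Iterate the mean-field fixed point m_i = tanh(sum_j W_ij m_j + theta_i)."""
    m = np.zeros(len(theta))
    for _ in range(n_iter):
        m = np.tanh(W @ m + theta)
    return m

def linear_response_correlations(W, m):
    """Linear-response correlations: invert A with
    A_ij = delta_ij / (1 - m_i^2) - W_ij, so C ~ A^{-1}.
    This is the single matrix inversion the teaser refers to."""
    A = np.diag(1.0 / (1.0 - m**2)) - W
    return np.linalg.inv(A)

# Small illustrative network: symmetric couplings, zero self-coupling.
W = np.array([[0.0, 0.2, -0.1],
              [0.2, 0.0, 0.3],
              [-0.1, 0.3, 0.0]])
theta = np.array([0.1, -0.2, 0.05])

m = mean_field_magnetisations(W, theta)   # approximate <s_i>
C = linear_response_correlations(W, m)    # approximate correlation matrix
print(np.round(m, 3))
print(C.shape)
```

In a learning loop, C and m would be matched against the data statistics to update W and theta; the point of the method is that no sampling is needed for this small, hidden-unit-free case.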


On Structured Variational Approximations

The problem of approximating a probability distribution occurs frequently in many areas of applied mathematics, including statistics, communication theory, machine learning, and the theoretical analysis of complex systems such as neural networks. Saul and Jordan have recently proposed a powerful method for efficiently approximating probability distributions known as structured variational approximati...


Discrete-valued Neural Networks Using Variational Inference

The increasing demand for neural networks (NNs) being employed on embedded devices has led to plenty of research investigating methods for training low precision NNs. While most methods involve a quantization step, we propose a principled Bayesian approach where we first infer a distribution over a discrete weight space from which we subsequently derive hardware-friendly low precision NNs. To t...




Publication date: 1999